Jointly Informative Feature Selection
Authors
Abstract
We propose several novel criteria for selecting groups of jointly informative continuous features in the context of classification. Our approach combines a Gaussian modeling of the feature responses with derived upper bounds on their mutual information with the class label and on their joint entropy. We further propose specific algorithmic implementations of these criteria which reduce computational complexity by up to two orders of magnitude, making these strategies tractable in practice. Experiments on multiple computer-vision databases, and with several types of classifiers, show that this class of methods outperforms state-of-the-art baselines in both speed and classification accuracy.
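Under a Gaussian model, the entropies involved have closed forms, which is what makes criteria of this kind cheap to evaluate: the differential entropy of a d-dimensional Gaussian is 0.5·log((2πe)^d·det Σ), and I(X; Y) = H(X) − H(X | Y) can be estimated by fitting one Gaussian to the pooled responses and one per class. The sketch below illustrates this general idea only; it is not the authors' implementation, and the function names are illustrative:

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy of a d-dim Gaussian: 0.5 * log((2*pi*e)^d * det(cov))."""
    cov = np.atleast_2d(cov)
    d = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (d * np.log(2 * np.pi * np.e) + logdet)

def gaussian_mi_score(X, y):
    """Gaussian-model estimate of I(X; y) for a feature group X (shape n x d):
    I(X; y) = H(X) - H(X | y) ~= H_gauss(X) - sum_c p(c) * H_gauss(X | y=c)."""
    classes, counts = np.unique(y, return_counts=True)
    priors = counts / len(y)
    h_marginal = gaussian_entropy(np.cov(X, rowvar=False))
    h_conditional = sum(p * gaussian_entropy(np.cov(X[y == c], rowvar=False))
                        for c, p in zip(classes, priors))
    return h_marginal - h_conditional
```

Because the score is defined on a whole group of features rather than one feature at a time, it can be plugged into a greedy forward search that adds, at each step, the feature maximizing the joint score of the current group — which is where joint (rather than marginal) informativeness pays off.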
Similar works
Jointly Informative Feature Selection Made Tractable by Gaussian Modeling
We address the problem of selecting groups of jointly informative, continuous features in the context of classification and propose several novel criteria for performing this selection. The proposed class of methods is based on combining a Gaussian modeling of the feature responses with derived bounds on and approximations to their mutual information with the class label. Furthermore, specific...
Adaptive Hypergraph Learning for Unsupervised Feature Selection
In this paper, we propose a new unsupervised feature selection method to jointly learn the similarity matrix and conduct both subspace learning (via learning a dynamic hypergraph) and feature selection (via a sparsity constraint). As a result, we reduce the feature dimensions using different methods (i.e., subspace learning and feature selection) from different feature spaces, and thus makes ou...
Feature Selection Using Multi Objective Genetic Algorithm with Support Vector Machine
Different approaches have been proposed for feature selection to obtain a suitable feature subset among all features. These methods search the feature space for feature subsets which satisfy some criteria or optimize several objective functions. The objective functions are divided into two main groups: filter and wrapper methods. In filter methods, feature subsets are selected due to some measu...
Selecting Informative Traits for Multivariate Quantitative
A major consideration in multitrait analysis is which traits should be jointly analyzed. As a common strategy, multitrait analysis is performed either on pairs of traits or on all traits. To fully exploit the power of multitrait analysis, we propose variable selection to choose a subset of informative traits for multitrait quantitative trait locus (QTL) mapping. The proposed meth...
Active Feature Acquisition with Supervised Matrix Completion
Missing features are a serious problem in many applications, which may lower the quality of training data and significantly degrade learning performance. Because feature acquisition usually involves special devices or a complex process, it is expensive to acquire all feature values for the whole dataset. On the other hand, features may be correlated with each other, and some values may ...